Applied and Environmental Microbiology
● American Society for Microbiology
Preprints posted in the last 7 days, ranked by how well they match Applied and Environmental Microbiology's content profile, based on 301 papers previously published here. The average preprint has a 0.11% match score for this journal, so anything above that is already an above-average fit.
Yu, J.; Tillema, S.; Akel, M.; Aron, A.; Espinosa, E.; Fisher, S. A.; Branche, T. N.; Mithal, L. B.; Hartmann, E. M.
Benzalkonium chloride (BAC) is widely used as a disinfectant in cleaning products and is frequently detected in indoor dust. In this study, we assessed dust samples, along with information on cleaning product use, from 24 pregnant participants. Dust samples were analyzed for BAC concentration and microbial tolerance. Different chain lengths of BAC (C12, C14, and C16) were quantified using LC-MS/MS, and bacterial isolates were tested for BAC tolerance using minimum inhibitory concentration (MIC) assays. BAC was ubiquitously detected, with C12 and C14 being dominant. Higher BAC concentrations were associated with reported disinfectant use and increased microbial tolerance. These findings suggest that indoor antimicrobial use may promote microbial resistance, highlighting potential exposure risks in indoor environments and the need for further investigation into health and ecological impacts.
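The MIC assay mentioned above reads out the lowest concentration in a two-fold dilution series that prevents visible growth. A minimal sketch of that readout logic, with purely illustrative concentrations and growth calls (not the study's data):

```python
# Sketch of a broth-microdilution MIC readout (illustrative values only).
# A two-fold dilution series is tested; the MIC is the lowest
# concentration at which no visible growth is recorded.

def mic_from_series(concentrations, growth):
    """Return the lowest concentration with no growth, or None if all grew."""
    for conc, grew in sorted(zip(concentrations, growth)):
        if not grew:
            return conc
    return None

# Hypothetical BAC series in mg/L (two-fold steps) and growth calls.
concs = [0.5, 1, 2, 4, 8, 16, 32]
growth = [True, True, True, False, False, False, False]
print(mic_from_series(concs, growth))  # prints 4
```

A tolerant isolate shifts the no-growth boundary up the series, which is how the higher MICs associated with higher dust BAC concentrations would appear in this readout.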
Paulos, A. P.; Zulli, A.; Duong, D.; Shelden, B.; White, B. J.; North, D.; Boehm, A. B.; Wolfe, M. K.
Respiratory infections caused by bacterial pathogens such as Mycobacterium tuberculosis and Bordetella pertussis have increased since the COVID-19 pandemic, yet clinical surveillance of both suffers from underreporting and delayed diagnoses. Wastewater monitoring is a valuable public health surveillance tool that can help fill gaps in clinical data, yet it has rarely been applied to respiratory bacterial pathogens despite evidence of bacterial shedding via excretion types that enter wastewater. In this study, we investigated the possibility of wastewater monitoring for two bacterial respiratory diseases, tuberculosis and pertussis, using two case studies of wastewater monitoring for M. tuberculosis and B. pertussis. We retrospectively measured concentrations of these pathogens in wastewater samples collected longitudinally from communities with and without known outbreaks of these diseases. We designed and validated a novel B. pertussis-specific assay for the NAD(P) gene; B. pertussis nucleic acids were detected sporadically in wastewater during an identified outbreak. We used a highly specific, established assay for M. tuberculosis nucleic acids and found low concentrations of the marker in wastewater that were lag-correlated with clinical incidence rates 5 weeks later. These findings support the potential of wastewater monitoring for M. tuberculosis and B. pertussis to identify communities with outbreaks of tuberculosis and pertussis and to provide early warning for tuberculosis.
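The lag correlation reported above can be illustrated with a small sketch: shift the wastewater series forward in time and find the lag that maximizes Pearson correlation with clinical incidence. The series below are synthetic, constructed so that cases echo wastewater five weeks later; `pearson` and `lagged_corr` are hand-rolled helpers, not the study's code:

```python
# Sketch of lag correlation between a wastewater signal and clinical cases.
# All data below are synthetic and purely illustrative.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def lagged_corr(wastewater, clinical, lag):
    """Correlate wastewater at week t with clinical incidence at week t + lag."""
    return pearson(wastewater[:len(wastewater) - lag], clinical[lag:])

# Synthetic weekly data where clinical cases echo wastewater 5 weeks later.
ww = [1, 2, 4, 8, 5, 3, 2, 1, 1, 2, 3, 5, 8, 6, 4]
clin = [0] * 5 + ww[:-5]  # a perfect 5-week echo, for illustration
best = max(range(8), key=lambda lag: lagged_corr(ww, clin, lag))
print(best)  # prints 5: the lag with the highest correlation
```

In practice the correlation at the best lag would be well below 1 and would need a significance test, but the scanning logic is the same.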
Shinto, H.; Chowell, G.; Takayama, Y.; Ohki, Y.; Saito, K.; Mizumoto, K.
Background: In long-term care facilities (LTCFs), close-contact identification often relies on staff recall and monitoring records because residents may be unable to self-report reliably. How these different record-generation processes relate to proximity-based sensor measurements in routine LTCF workflow remains unclear, and how such differences may influence contact-based decision-making in outbreak response is not well understood. Methods: We conducted a five-day observational study in a Japanese LTCF using ultra-wideband (UWB) indoor positioning. Twenty-seven participants wore UWB tags, including 16 residents and 11 staff members; 10 staff members completed questionnaires. We compared UWB-derived proximity with questionnaire-derived contacts from staff self-report and monitoring-based proxy records, and assessed directional discrepancies under multiple distance-time thresholds. Results: Questionnaire-based records and UWB-derived proximity showed different patterns of discrepancy across contact types. Within this facility, resident-related monitoring-based proxy records showed relatively small directional discrepancies, whereas staff self-reports tended to identify additional resident-staff contacts under the baseline threshold (≤1.0 m for ≥15 min). Several alternative thresholds were associated with discrepancies closer to zero than the baseline, although the apparent ranking varied by summary metric. Conclusions: In this single-facility observational study, different contact-list generation processes were associated with different patterns of discrepancy relative to a proximity-based operational measure. These findings support interpretation in terms of workflow-specific contact-list generation rather than a single universally optimal threshold and may help inform facility-level review of contact identification practices in LTCFs.
These findings support aligning contact identification strategies with facility-specific workflows to improve the feasibility and effectiveness of IPC practices in LTCFs.
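The distance-time thresholds described above amount to a simple rule over positioning samples: a contact is a run of consecutive samples within the distance cutoff whose total duration meets the time cutoff. A minimal sketch, assuming a fixed one-minute sampling interval and illustrative distances (not the study's UWB data):

```python
# Sketch of threshold-based contact detection from positioning samples,
# using the baseline rule from the abstract (<=1.0 m for >=15 min).
# The sampling interval and the distance trace are assumptions.

def detect_contacts(distances_m, sample_interval_min=1.0,
                    dist_cutoff_m=1.0, min_duration_min=15.0):
    """Return (start_index, end_index) pairs of qualifying proximity runs."""
    contacts, run_start = [], None
    for i, d in enumerate(distances_m + [float("inf")]):  # sentinel ends last run
        if d <= dist_cutoff_m:
            if run_start is None:
                run_start = i
        elif run_start is not None:
            if (i - run_start) * sample_interval_min >= min_duration_min:
                contacts.append((run_start, i - 1))
            run_start = None
    return contacts

# 20 one-minute samples: a 16-minute close-proximity run, then separation.
trace = [0.8] * 16 + [3.0] * 4
print(detect_contacts(trace))  # prints [(0, 15)]
```

Varying `dist_cutoff_m` and `min_duration_min` reproduces the kind of threshold sensitivity analysis the study compares against questionnaire-derived contact lists.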
Rieger, C. D.; Molaeitabari, A.; Dahms, T. E. S.; El-Halfawy, O. M.
Standard in vitro antimicrobial susceptibility testing (AST) using Mueller-Hinton broth (MHB) does not reflect infection-site conditions, and its results often do not correlate with therapeutic outcomes. Here, we compared the antibiotic susceptibility of methicillin-resistant Staphylococcus aureus (MRSA), a common chronic wound pathogen, in simulated wound fluid (SWF) resembling wound exudate versus MHB, revealing discordant AST results across six of nine tested antibiotic classes. The most significant were 128-fold increased resistance to tetracyclines and 256-fold sensitization to β-lactams in SWF. Tetracycline resistance was mediated by MntC, an extracellular manganese-binding protein, whereas β-lactam sensitization was driven by cell envelope remodelling in SWF. Galleria mellonella wound infection results matched the SWF susceptibility phenotypes, suggesting SWF better predicts in vivo wound infection therapeutic outcomes. These comprehensive phenotypic and mechanistic insights into MRSA antibiotic responses under wound-infection-mimetic conditions with direct in vivo validation identify a potential new antibiotic adjuvant target and may guide improved antibiotic therapy for MRSA wound infections.
Ahmed, W.; Gebrewold, M.; Verhagen, R.; Koh, M.; Gazeley, J.; Levy, A.; Simpson, S.; Nolan, M.
Wastewater surveillance (WWS) is established as a vital tool for monitoring polio and SARS-CoV-2, with potential to improve surveillance for many other infectious diseases. This study evaluated the feasibility of detecting measles virus (MeV) RNA in wastewater as part of a national WWS preparedness trial in Brisbane, Australia, from March to June 2025. Composite and passive sampling methods were employed in parallel at three wastewater treatment plants serving populations between 230,000 and 584,000. Nucleic acids were extracted and analyzed using RT-qPCR targeting the MeV N and M genes to distinguish wild-type and vaccine strains. MeV RNA was detected in both 24-hour composite and passive samples on May 26-27, 2025, from the largest catchment (584,000), which also included an international airport. No measles cases were reported in this city or region within 4 weeks of the WWS detections. The detections were confirmed as a vaccine-derived measles virus (MeVV) strain via a strain-specific RT-qPCR assay. Extraction recoveries varied (11.5% to 70.5%), with passive sampling showing higher efficiency. This is the first report of passive sampling for detection of MeV. These findings are consistent with other studies reporting WWS results for both MeVV genotype A and wild-type genotypes B and/or D. The study demonstrates the potential for sensitive MeV WWS with rapid differentiation of MeVV from wild-type MeV shedding, including in airport transport hubs and with different sample types. Use of WWS could strengthen measles surveillance by enabling rapid detection of MeV RNA and supporting outbreak preparedness and response. This requires optimised methods that are specific to wild-type MeV or that differentiate it from MeVV. Furthermore, the successful detection of MeV using passive sampling in this study highlights its potential for deployment in diverse global contexts, including non-sewered settings.
Gohari, M. R.; Zhang, P.; Villegas, A.; Rosella, L. C.; Patel, S. N.; Hopkins, J. P.; Duvvuri, V. R.
Antimicrobial resistance (AMR) is a growing global public health threat that complicates the treatment and control of bacterial infections. Shigella spp., a leading cause of bacterial diarrhea worldwide, has increasingly exhibited resistance to multiple antimicrobial agents commonly recommended for treating severe shigellosis. Although conventional antimicrobial susceptibility testing (AST) remains the reference standard, it is time-consuming and provides limited insight into the genetic mechanisms underlying resistance. Whole-genome sequencing (WGS) has emerged as a complementary approach for AMR detection by enabling direct identification of resistance determinants encoded in bacterial genomes. Machine learning (ML) methods applied to genomic features such as k-mers have shown promise for predicting resistance phenotypes from WGS data; however, applications to Shigella remain limited. In this study, we developed and evaluated an interpretable ML framework for predicting ciprofloxacin resistance using k-mer features derived from WGS data of 1,424 Shigella isolates collected in Ontario, Canada, between 2018 and 2025. K-mers were extracted from known gene targets associated with ciprofloxacin resistance, including chromosomal quinolone resistance-determining regions (QRDRs: gyrA and parC) and plasmid-mediated determinants (qnr). Supervised ML approaches were trained and compared. We evaluated the influence of k-mer length (k = 11, 15, 21, and 31) on predictive performance and model interpretability, and compared models based on chromosomal determinants alone with models incorporating both chromosomal and plasmid-mediated determinants. A Random Forest classifier achieved the most consistent performance across models. Inclusion of plasmid-mediated determinants improved predictive accuracy relative to chromosomal-only models.
Although differences across k-mer lengths were modest, k = 11 produced the highest area under the receiver operating characteristic curve (AUC) and the lowest Brier score. SHAP analyses localized high-impact features within the QRDRs of gyrA and parC, supporting biological interpretability. These findings demonstrate that biologically informed, k-mer-based ML models can accurately and transparently predict ciprofloxacin resistance in Shigella, supporting their potential integration into genomic AMR surveillance and digital public health frameworks. Author summary: In this study, we used genome sequencing data to develop machine learning models that predict ciprofloxacin resistance in Shigella directly from bacterial DNA. We focused on small DNA fragments (k-mers) derived from known resistance genes and mutations. Among the approaches tested, a Random Forest model showed the most consistent performance. Combining chromosomal mutations with plasmid-mediated resistance genes improved prediction accuracy and helped identify key genetic regions associated with resistance. These findings demonstrate that machine learning applied to genomic data can accurately and interpretably predict antibiotic resistance, supporting its potential use in genomic surveillance and public health monitoring.
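The k-mer featurization step that precedes model training can be sketched as follows: each isolate's target-gene sequence is reduced to a presence/absence vector over the union of observed k-mers. The toy sequences below stand in for gyrA/parC/qnr fragments and are not the study's data; a mutation-spanning k-mer is what separates the two "isolates":

```python
# Sketch of k-mer featurization for resistance prediction.
# Toy sequences, not real QRDR data; k is small for readability.

def kmers(seq, k):
    """All overlapping k-mers of a sequence, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def featurize(sequences, k):
    """Binary presence/absence matrix over the union of observed k-mers."""
    vocab = sorted(set().union(*(kmers(s, k) for s in sequences)))
    return vocab, [[int(m in kmers(s, k)) for m in vocab] for s in sequences]

# Two toy 'isolates' differing by a single point mutation (as in a QRDR).
seqs = ["GACGTAC", "GACGAAC"]
vocab, X = featurize(seqs, k=4)
# Shared k-mers get a 1 in both rows; k-mers spanning the mutation differ.
print(len(vocab), X)
```

Matrices like `X` are what a Random Forest (or any supervised classifier) would be trained on, with resistance phenotypes from AST as labels; the feature importances then map back to specific k-mers and hence to gene regions.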
Maphosa, T.; Machekano, R.; Denoeud-Ndam, L.; Makonokaya, L.; Chilikutali, L.; Kalitera, L. U.; Matiya, E.; Mayi, A.; Musarandega, R.; Matola, B. W.; Berhan, A. Y.; Ahimbisibwe, A.; Tiam, A.
Background: To promote sustainability and strengthen national ownership of Advanced HIV Disease (AHD) services, a transition was implemented across 22 health facilities in Central Malawi. This transition involved shifting responsibility for key AHD program elements, including clinical service delivery, diagnostics, provider mentorship, and reporting systems, from implementing partner-led implementation to full Ministry of Health (MoH) leadership. This evaluation assessed the impact of the transition on diagnostic coverage, TB preventive therapy (TPT) uptake, and 12-month survival outcomes. Methods: A retrospective cohort study was conducted involving all children and adults enrolled in AHD care during the pre-MoH transition (January 2020-December 2021) and post-MoH transition (January 2023-December 2024) periods. Eligibility followed national AHD criteria: CD4 count <200 cells/mm3, WHO stage 3 or 4 illness, or age <5 years. AHD client data were abstracted from clinical records and linked across routine facility registers to assess diagnostic and treatment indicators. Kaplan-Meier survival curves, Cox proportional hazards, and Fine and Gray competing risk models were used to evaluate 6- and 12-month mortality and retention as primary outcomes. Results: A total of 1,044 AHD clients were included in the evaluation (553 pre-transition; 491 post-transition). Median age increased post-transition (35.9 to 38.5 years, p<0.001). CD4 testing uptake declined (80.7% to 46.0%, p<0.001), while WHO staging and TB diagnostic coverage improved. TB diagnoses decreased (44.5% to 31.2%, p=0.002). TPT uptake dropped from 46.4% to 31.6% (p<0.001). Twelve-month mortality significantly declined from 9.4% to 5.5% (adjusted hazard ratio [aHR]=0.59, 95% CI: 0.37-0.94, p=0.026). Retention in care remained stable (HR=0.86, 95% CI: 0.62-1.20, p=0.384). Conclusions: Transitioning AHD services to MoH leadership sustained key program outcomes and significantly reduced mortality.
Continued mentorship and government ownership were key drivers of success. However, declines in CD4 testing and TPT coverage highlight the need for strengthened diagnostics and preventive care integration. These findings support scaling nationally-led AHD models in high-burden HIV settings.
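The Kaplan-Meier estimator named in the Methods can be sketched in a few lines: at each distinct event time, survival is multiplied down by (1 - deaths/at-risk). The tiny cohort below is synthetic (times in months, event = 1 for death, 0 for censoring), not the evaluation's data:

```python
# Sketch of a Kaplan-Meier survival estimate on a synthetic mini-cohort.
# Times are months of follow-up; event = 1 is death, 0 is censoring.

def kaplan_meier(times, events):
    """Return (time, survival) steps at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk, surv, steps = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)  # deaths + censored at t
        if deaths:
            surv *= 1 - deaths / at_risk
            steps.append((t, surv))
        at_risk -= removed
        i += removed
    return steps

times = [2, 3, 3, 5, 8, 12, 12, 12]
events = [1, 1, 0, 1, 0, 0, 0, 0]
print(kaplan_meier(times, events))  # survival drops only at event times 2, 3, 5
```

A hazard ratio like the reported aHR = 0.59 comes from fitting a Cox model across the two transition-period curves, which this sketch does not attempt.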
Bauman, A.; Owen, K.; Messing, S.; Macdonald, H.; Nettlefold, L.; Richards, J.; Vandelanotte, C.; Chen, I.-H.; Cullen, B.; van Buskirk, J.; van Itallie, A.; Coletta, G.; O'Halloran, P.; Randle, E.; Nicholson, M.; Staley, K.; McKay, H. A.
Military aviation training noise remains understudied despite its widespread impacts across urban, rural, and wilderness areas. The predominance of low-frequency noise and repetitive training can create pervasive noise pollution, yet past research often fails to capture the full range of health and quality-of-life effects. This study analyzed two complaint datasets related to Whidbey Island Naval Air Station noise: U.S. Navy records (2017-2020) and Quiet Skies Over San Juan County data (2021-2023). We analyzed and mapped sentiment intensity from noise complaints relative to modeled annual noise exposure, developed a typology to classify impacts, and modeled the environmental and operational factors influencing complaints. Findings revealed widespread negative sentiment and anger, often beyond the bounds of estimated noise contours, suggesting that annual cumulative noise models inadequately estimate community impacts. Complaints consistently highlighted sleep disturbance, hearing and health concerns, and compromised home environments due to shaking, vibration, and disruption of daily life. Residents also reported significant social, recreational, and work disruptions, along with feelings of fear, helplessness, and concern for children's well-being. The number of complaints was strongly associated with training schedules, with late-night sessions being the strongest predictor. A delayed response pattern suggests residents reach a frustration threshold before filing complaints. Overall, our findings demonstrate persistent negative sentiment and diverse impacts from military aviation noise. Results highlight the need for improved noise metrics, modeling, and operational adjustments to mitigate the most disruptive effects.
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were significantly less likely than women to have recently taken antibiotics. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions targeting inappropriate antibiotic dispensing.
Key points
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), of which Madagascar bears a high burden.
- ABU was widespread among livestock owners in northeast Madagascar, with the majority of study participants reporting ABU in their lifetimes and nearly half of those reporting ABU having also taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing antibiotics through prescriptive means (such as doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts, while underscoring the need for more comprehensive research on the drivers and patterns of ABU.

Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) had used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most commonly used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, like general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, CI 0.30-0.82).
These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
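The male-vs-female association reported above is an adjusted odds ratio from a mixed-effects model; the underlying crude quantity, an odds ratio from a 2x2 table with a Wald confidence interval, can be sketched as follows. The counts are hypothetical, chosen only for illustration:

```python
# Sketch of a crude odds ratio with a Wald 95% CI from a 2x2 table.
# Counts are hypothetical; the study's reported OR (0.50) came from a
# mixed-effects logistic model, not this crude calculation.
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """OR for exposed (a events, b non-events) vs. unexposed (c, d)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 40/100 men vs. 87/190 women reporting recent use.
or_, ci = odds_ratio(40, 60, 87, 103)
print(round(or_, 2), tuple(round(x, 2) for x in ci))
```

An OR below 1 with a CI excluding 1 is how "men were significantly less likely" reads off such a table; adjustment for covariates and site-level clustering shifts the estimate, which is why the paper reports the model-based value.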
Bastard, J.; Migne, C.; Helle, T.; Agneray, E.; Bigeard, C.; Boudjadi, Y.; Chevrier, M.; Dumarest, M.; Gondard, M.; Martin-Latil, S.; Mathews-Martin, L.; Petit, T.; Charpentier, T.; Pouillevet, H.; Durand, B.; Metras, R.; Gonzalez, G.
Zoos may serve as sentinel sites for zoonotic vector-borne diseases. West Nile virus (WNV) and Usutu virus (USUV) are closely related orthoflaviviruses transmitted between Culex mosquitoes and a bird reservoir. Both viruses can also infect mammals, including humans, where they may cause symptoms and, more rarely, hospitalization and death. However, serological cross-reactivity between WNV and USUV complicates their differential diagnosis. Here, we aimed to reconstruct the dynamics of WNV emergence in a zoo located in a newly affected area in Europe, using ELISA and Virus Neutralization Test (VNT) serological analysis of 1707 animal sera collected between 2015 and 2024. Combining these data in a model accounting for cross-reactivity with USUV, we estimated yearly forces of infection (FOI) for both viruses, and found that WNV likely circulated in the area one year before the first cases were reported to the passive surveillance system. Our results also showed that, in the zoo, mammals and reptiles had a lower risk of infection than birds (relative risk of 0.14 [0.05; 0.28]), and that the exposure of birds to water (aquatic lifestyle or proximity to stagnant water) affected this risk. Finally, we estimated diagnostic parameters, including the sensitivity of the VNT (80.4% [76.5%; 84.3%]), the expected VNT titer value, and the level of serological cross-reactivity between the viruses during the VNT. To conclude, our modelling framework allowed us to disentangle the co-circulation of two closely related viruses, a crucial point in ensuring reliable sentinel surveillance of these vector-borne zoonotic pathogens.
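Force-of-infection estimates like those above rest on the catalytic-model relationship between FOI and seroprevalence: under a constant FOI λ, seroprevalence after t years of exposure is p = 1 - exp(-λt), so λ = -ln(1 - p) / t. A minimal worked version, with illustrative numbers rather than the study's estimates:

```python
# Sketch of the catalytic-model link between force of infection (FOI)
# and seroprevalence. Values are illustrative, not the study's estimates;
# the study's model additionally handles WNV/USUV cross-reactivity.
from math import exp, log

def foi_from_seroprevalence(p, t_years):
    """Constant yearly FOI implied by seroprevalence p after t years."""
    return -log(1 - p) / t_years

# E.g. 30% seropositive animals after 4 years of exposure.
lam = foi_from_seroprevalence(0.30, 4)
p_back = 1 - exp(-lam * 4)  # round-trip check: recovers p = 0.30
print(round(lam, 4), round(p_back, 2))
```

Year-specific FOIs, imperfect test sensitivity, and cross-reacting titers turn this one-liner into the full estimation problem the study solves, but the core exponential relationship is the same.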
Farre, R.; Salama, R.; Rodriguez-Lazaro, M. A.; Kiarostami, K.; Fernandez-Barat, L.; Oliveira, V. D. C.; Torres, A.; Farre, N.; Dinh-Xuan, A. T.; Gozal, D.; Otero, J.
Background: The COVID-19 pandemic exposed critical shortages of mechanical ventilators, particularly in low-resource settings. Disruptions in global supply chains and dependence on specialized components highlighted the need for scalable, locally manufacturable alternatives for emergency respiratory support. Aim: To describe and evaluate a simplified, supply-chain-independent mechanical ventilator assembled from widely available automotive and simple hardware components, intended as a last-resort solution. Methods: The ventilator is based on a reciprocating air pump driven by an automotive windshield wiper motor coupled to parallel shaft bellows and readily assembled passive membrane valves, requiring only materials available from standard hardware retailers, minimal tools, and basic manual skills. Ventilator performance was assessed through bench testing using a patient model simulating severe lung disease in adult (R=20 cmH2O·s/L, C=15 mL/cmH2O) and pediatric (R=50 cmH2O·s/L, C=10 mL/cmH2O) patients. A realistic proof of concept was performed in four mechanically ventilated 50-kg pigs. Results: The device delivered tidal volumes up to 600 mL and respiratory rates up to 45 breaths/min with PEEP up to 10 cmH2O, covering pediatric and adult ventilation ranges. In vivo testing showed that the ventilator maintained arterial blood gases within the targeted range. Technical details for ventilator construction are provided in an open-source video tutorial. Discussion: This low-cost ventilator demonstrated adequate performance under demanding conditions. Although not a substitute for commercial intensive care ventilators, its simplicity, autonomy, and independence from fragile supply chains provide a potentially life-saving option in resource-constrained emergency scenarios.
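The patient-model parameters in the Methods support a quick worked check of the pressures such a device must generate: for a single-compartment lung model, peak inspiratory pressure is approximately P = R·flow + VT/C + PEEP. The constant-flow assumption and the breath settings below are illustrative, not the bench protocol:

```python
# Worked check of peak pressure for a single-compartment lung model,
# using the adult parameters from the abstract (R = 20 cmH2O.s/L,
# C = 15 mL/cmH2O). Constant inspiratory flow is an assumption.

def peak_pressure(r_cmh2o_s_per_l, c_ml_per_cmh2o, vt_ml, insp_time_s, peep=0.0):
    """Approximate peak pressure (cmH2O) for a constant-flow breath."""
    flow_l_per_s = (vt_ml / 1000.0) / insp_time_s
    resistive = r_cmh2o_s_per_l * flow_l_per_s  # R * flow
    elastic = vt_ml / c_ml_per_cmh2o            # VT / C
    return resistive + elastic + peep

# A 450 mL breath over 1 s into the simulated severe adult lung, PEEP 10:
# resistive 9 + elastic 30 + PEEP 10 = 49 cmH2O.
print(round(peak_pressure(20, 15, 450, 1.0, peep=10), 1))  # prints 49.0
```

Pressures near 50 cmH2O for a modest tidal volume illustrate why the simulated severe-disease model is a demanding bench test for a bellows-driven pump.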
Welch, A. M.; Beseler, C. L.; Cross, S. T.
Purpose: Alpha-gal syndrome (AGS) is an emerging health issue. This syndrome, caused by tick bites, induces allergic reactions to the sugar molecule galactose-alpha-1,3-galactose after exposure to non-primate mammalian meat and other byproducts. Agricultural workers spend significant time outdoors, placing them at increased risk for tick bites and tick-borne diseases such as AGS. This study aimed to characterize farmers' and ranchers' prior knowledge, symptomology, and diagnostic experiences with AGS. Methods: We conducted a cross-sectional survey of more than 200 farmers and ranchers with a self-reported AGS diagnosis. The survey captured farmers' and ranchers' prior knowledge of and experience with tick bites and AGS, reported symptoms, and experiences obtaining a diagnosis. Findings: A total of 201 respondents across 26 states participated in the survey, with the majority from Missouri and Oklahoma. We identified four distinct symptom clusters, with the most commonly reported symptoms being abdominal cramping, diarrhea, itchy skin, and nausea. Women more often reported gastrointestinal discomfort, and men were more likely to fall in the mild symptom category. On average, participants reported 2.98 medical provider visits before receiving a diagnosis, with most being diagnosed by general practitioners and allergists. Conclusions: No previous studies have focused on the symptom and diagnostic experiences of farmers and ranchers with AGS. Capturing such data is essential, as these workers may experience unique occupational challenges following an AGS diagnosis. The diagnostic experience data support a continuing need to educate and empower AGS patients and providers, especially agricultural workers and providers serving rural communities.
Chihara, A.; Mizuno, R.; Kagawa, N.; Takayama, A.; Okumura, A.; Suzuki, M.; Shibata, Y.; Mochii, M.; Ohuchi, H.; Sato, K.; Suzuki, K.-i. T.
Fluorescent in situ hybridization (FISH) enables highly sensitive, high-resolution detection of gene transcripts. Moreover, by employing multiple probes, this technique allows for multiplexed, simultaneous detection of distinct gene expression patterns spatiotemporally, making it a valuable spatial transcriptomics approach. Owing to these advantages, FISH techniques are rapidly being adopted across diverse areas of basic biology. However, conventional protocols often rely on volatile, toxic reagents such as formalin or methanol, posing potential health risks to researchers. Here, we present a safer protocol that replaces these chemicals with low-toxicity alternatives, without compromising the high detection sensitivity of FISH. We validated this protocol using both in situ hybridization chain reaction (HCR) and signal amplification by exchange reaction (SABER)-FISH in frozen sections of various model organisms, including mouse (Mus musculus), amphibians (Xenopus laevis and Pleurodeles waltl), and medaka (Oryzias latipes). Our results demonstrate successful multiplexed detection of morphogenetic and cell-type marker genes in these model animals using this safer protocol. The protocol has the additional advantage of requiring no proteolytic enzyme treatment, thus preserving tissue integrity. Furthermore, we show that this protocol is fully compatible with EGFP immunostaining, allowing for the simultaneous detection of mRNAs and reporter proteins in transgenic animals. This protocol retains the benefits of highly sensitive, multiplexed, and multimodal detection afforded by integrating in situ HCR and SABER-FISH with immunohistochemistry, while providing a safer option for researchers, thereby offering a valuable tool for basic biology.
Meagher, N.; Hettiarachchi, D.; Hawkins, M. R.; Tavlian, S.; Spirkoska, V.; McVernon, J.; Carville, K. S.; Price, D. J.; Villanueva Cabezas, J. P.; Marcato, A. J.
Background: The World Health Organization has developed several global template protocols for epidemiological investigations, including for household transmission investigations (HHTIs). These investigations facilitate rapid characterisation of novel or re-emerging respiratory pathogens and support evidence-based public health actions. Beyond technical readiness, community buy-in is central to the feasibility and acceptability of HHTIs. Research is needed to determine their perceived legitimacy among the community, to inform local protocol adaptation and the development of implementation plans that consider community attitudes and needs. Methods: In 2025, we conducted a convenience survey of community members living in Victoria, Australia to explore: their understanding of emerging respiratory diseases; their willingness to take part in public health surveillance activities such as HHTIs; the acceptability of clinical and epidemiological data collection and respiratory/blood sample collection as main components of HHTIs; and participant comfort with including their companion animals in HHTIs. Results: We received 282 survey responses, of which 235 were included in the analysis dataset. Compared to the general Victorian population, our sample included a higher proportion of participants who reported being female, tertiary-educated, of Aboriginal and/or Torres Strait Islander heritage, born in Australia, and speaking only English at home. Participants indicated overall high levels of comfort and acceptability towards participation in HHTIs, particularly in relation to clinical and epidemiological data collection, with lesser but still high levels of comfort with providing multiple respiratory specimens in a 14-day period. Participants were least comfortable with other specimens such as urine and blood. Involving companion animals in HHTIs was as acceptable as the human-focused components.
Conclusions: Although our survey population was not representative of the general Victorian population, our findings provide valuable descriptive insights into the acceptability of HHTIs in Victoria, Australia, from which to benchmark future local and international surveys and community engagement activities.
Esele, B. A.; Oryokot, B.; Ssentongo, S. M.; Mulongo, M.; Akanyo, J.; Bongomin, F.
Background: Tuberculosis (TB) remains a leading cause of morbidity and mortality among people living with HIV (PLHIV), who face a 12-fold higher risk of active TB reactivation than HIV-negative individuals. TB preventive therapy (TPT) is an effective intervention, yet TB/HIV co-infection persists at 40-45%, raising questions about the durability of a single course of TPT. This study assessed the time from TPT completion to TB diagnosis and predictors of early TB reactivation. Methods: We conducted a retrospective case-only cohort study using routine data from Uganda's electronic medical record system, TB registers, and patient files at three TASO Centres of Excellence (Soroti, Mbale, Tororo). PLHIV on antiretroviral therapy (ART) diagnosed with TB after completing TPT between 2022 and 2024 were included. Participant characteristics and time to TB diagnosis were summarised descriptively; predictors of early TB were identified using logistic regression. Results: Among 670 participants, most were female (464, 69.3%), with a mean age of 51.6 years (SD 14.5). Newly diagnosed TB accounted for 638 (95.2%), including bacteriologically confirmed pulmonary TB (535, 79.9%), clinically diagnosed TB (123, 18.4%), and extrapulmonary TB (12, 1.8%). Overall, 548 (82.8%) participants were virally suppressed, with most on Dolutegravir-based regimens (641, 95.7%). Early TB occurred in 144 (21.5%), with an average time to diagnosis of 2.6 years. Multivariable analysis showed care at TASO Soroti was protective (aOR = 0.104, p < 0.001), while clinically diagnosed TB (aOR = 1.91, p = 0.007), shorter ART duration (<5 years: aOR = 3.07, p = 0.001; 5-10 years: aOR = 1.74, p = 0.018), and viral suppression (aOR = 1.87, p = 0.014) increased the odds of early TB. Conclusions: TB can occur soon after TPT completion, with one in five PLHIV developing early disease, particularly those with shorter ART duration, even in the presence of viral suppression.
Strengthening TB screening, continuous monitoring, and repeat TPT for high-risk groups may improve prevention.
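The logistic-regression step this abstract describes can be illustrated in miniature. The sketch below fits a univariate logistic model by Newton-Raphson on synthetic data; the exposure coding ("short ART duration") and all counts are hypothetical, not the study's data, and the study's actual multivariable model adjusted for additional covariates.

```python
import math

def fit_logistic(x, y, iters=50):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by Newton-Raphson; returns (b0, b1)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        # Accumulate gradient (g0, g1) and 2x2 Hessian (h00, h01, h11)
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
            w = p * (1.0 - p)
            h00 += w
            h01 += w * xi
            h11 += w * xi * xi
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (-h01 * g0 + h00 * g1) / det
    return b0, b1

# Synthetic example: exposure = short ART duration (1) vs long (0),
# outcome = early TB after TPT completion (1/0); 40% vs 20% event rates
x = [1] * 50 + [0] * 50
y = [1] * 20 + [0] * 30 + [1] * 10 + [0] * 40
b0, b1 = fit_logistic(x, y)
odds_ratio = math.exp(b1)
```

For a binary exposure, exp(b1) from the fitted model reproduces the empirical odds ratio between the two groups, which is a quick sanity check for any logistic implementation.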
Shet, A.; Raj, M. B.; Sannigrahi, S.; Seenappa, B.; Reddy, L.; Sharma, A. A.; Narayanan, A. G.; Satish Kumar, S.; Ganapathi, L.
Background: Adolescents and young adults with perinatally acquired HIV (APHIV) face complex psychosocial and structural challenges that may undermine resilience, a modifiable psychosocial determinant of treatment engagement and health outcomes. Evidence on peer-led interventions targeting resilience among APHIV in South Asia remains limited. We evaluated resilience and its correlates among participants in the ImPossible Fellowship, a peer-led mentorship intervention in India. Methods: We conducted a cross-sectional evaluation of 216 APHIV following completion of the 24-month ImPossible Fellowship in southern India in 2024. Surveys administered by trained youth investigators assessed sociodemographic, educational, and clinical characteristics. Resilience was measured using the Child and Youth Resilience Measure-Revised (CYRM-R), a validated multidimensional tool capturing personal and relational resilience dimensions. Low resilience was defined as a CYRM-R score at or below the 33rd percentile. Multivariate logistic regression identified independent correlates of low resilience, and sensitivity analyses explored alternative CYRM-R thresholds. Results: Participants had a mean age of 18.7 years (range 9-24); 50% had no surviving parents, and 43% lived in child care institutions. Median resilience scores were high (74, interquartile range [IQR] 69-78), and 91% achieved viral suppression. In multivariate analyses, three factors were independently associated with low resilience: loss of both parents (adjusted odds ratio [aOR] 4.35, 95% CI 2.09-9.06), school discontinuation (aOR 2.43, 95% CI 1.10-5.34), and self-reported communication barriers at HIV clinics (aOR 5.83, 95% CI 2.69-12.64). These associations were consistent across sensitivity analyses at alternative resilience thresholds.
At the most stringent threshold of low resilience (CYRM-R score at or below the 15th percentile), unsuppressed viral load also emerged as a significant correlate, suggesting that treatment failure may be concentrated among those with the most severely compromised resilience. Conclusions: APHIV participating in a peer-led mentorship program demonstrated high overall resilience and viral suppression, but also revealed addressable vulnerabilities mapping to specific programmatic priorities. Peer-led models offer a promising foundational platform; however, complementary structural and psychosocial enhancements targeting these modifiable determinants are essential to optimize outcomes for those facing the greatest cumulative adversity.
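A percentile cutoff like the study's 33rd-percentile low-resilience threshold can be computed directly with Python's standard library. The scores below are hypothetical illustrations, not study data.

```python
import statistics

# Hypothetical CYRM-R total scores for a small sample
scores = [74, 69, 78, 65, 80, 71, 77, 60, 73, 75, 68, 79]

# statistics.quantiles with n=100 returns the 1st..99th percentile cut points
p33 = statistics.quantiles(scores, n=100)[32]   # 33rd percentile
low_resilience = [s for s in scores if s <= p33]
```

Swapping the index (e.g. `[14]` for the 15th percentile) reproduces the stricter sensitivity-analysis threshold on the same data.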
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems that contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study was to investigate the causal effects of higher LST (given greater statistical power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Single nucleotide polymorphisms (SNPs) were first selected as genetic instruments for LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) at the genome-wide significance threshold (p < 5×10^-8) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, comprising 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value threshold of p ≤ 3.47×10^-4. Sensitivity analyses included Steiger filtering, MR-Egger, and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1%) of the 1,719 diseases. Most of these diseases were in the musculoskeletal and connective tissue (n=37), genitourinary (n=12), and respiratory (n=8) systems. Genetic liability to lower MVPA was associated with six diseases: three in the musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified with higher LST), and three in the respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses.
Results replicated in UK Biobank where data were available. Conclusions: Higher levels of sedentary behaviour and lower levels of physical activity causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
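The inverse-variance weighted (IVW) method used in the main analyses is, at its core, a precision-weighted average of per-SNP causal estimates (Wald ratios). A minimal sketch; the betas and standard errors below are made-up illustrations, not values from the study.

```python
import math

def ivw_estimate(beta_xy, se_xy):
    """Inverse-variance weighted pooling of per-SNP Wald ratios.
    beta_xy: per-SNP causal estimates (outcome beta / exposure beta);
    se_xy: their standard errors. Returns (pooled beta, pooled SE)."""
    weights = [1.0 / se ** 2 for se in se_xy]           # precision weights
    beta = sum(w * b for w, b in zip(weights, beta_xy)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se

# Hypothetical Wald ratios from a handful of genetic instruments
betas = [0.12, 0.08, 0.15, 0.10]
ses = [0.04, 0.05, 0.06, 0.04]
b, s = ivw_estimate(betas, ses)
z = b / s   # Wald statistic, compared against the Bonferroni threshold
```

More precise instruments (smaller standard errors) dominate the pooled estimate, which is why the IVW estimator gains power from the 117 LST SNPs relative to the 18 MVPA SNPs.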
Pietilainen, O.; Salonsalmi, A.; Rahkonen, O.; Lahelma, E.; Lallukka, T.
Objectives: Longer lifespans lead to longer time in retirement, despite efforts to raise the retirement age. It is therefore important to study how retirement years can be spent free of disease. This study examined socioeconomic and sociodemographic differences in healthy years spent in retirement. Methods: We followed a cohort of retired Finnish municipal employees (N=4,231, average follow-up 15.4 years) in national administrative registers for major chronic diseases: cancer, coronary heart disease, cerebrovascular disease, diabetes, asthma or chronic obstructive pulmonary disease, dementia, mental disorders, and alcohol-related disorders. Median healthy years in retirement and age at first occurrence of illness (ICD-10 and ATC-based) in each combination of sex, occupational class, and age at retirement were predicted using Royston-Parmar models. Prevalence rates for each diagnostic group were calculated. Results: The most healthy years in retirement were spent by women who had worked in semi-professional jobs and retired at age 60-62 (median predicted healthy years 11.6, 95% CI 10.4-12.7). The fewest healthy years in retirement were spent by men who had worked in routine non-manual jobs and retired after age 62 (median predicted healthy years 6.5, 95% CI 4.4-9.5). Diabetes was slightly more common among lower occupational class women, and dementia among manual-working women who retired at age 60-62. Discussion: Healthy years in retirement are not enjoyed equally by women and men, or by those who retire earlier versus later. Policies aiming to increase the retirement age should consider the effects of these gaps on retirees and the equitability of those effects.
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a fixed-effects panel data methodology, the analysis controls for time-invariant national heterogeneity, supporting more reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting that targeted service implementation is the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), interpreted as a surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting that this metric primarily reflects ongoing indirect cost burdens on the established patient cohort or, alternatively, represents a diagnostic access barrier that results in lower case finding.
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
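The fixed-effects (within) estimator this abstract relies on removes time-invariant country heterogeneity by demeaning each variable within a country before pooling. A minimal sketch with made-up panel data (country labels, coverage-index values, and incidence values are hypothetical):

```python
def within_estimator(panel):
    """panel: {country: [(x, y), ...]} -> fixed-effects slope estimate.
    Demeans x and y within each country, then fits pooled no-intercept OLS."""
    dx, dy = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)   # country mean of x
        my = sum(y for _, y in obs) / len(obs)   # country mean of y
        for x, y in obs:
            dx.append(x - mx)
            dy.append(y - my)
    return sum(a * b for a, b in zip(dx, dy)) / sum(a * a for a in dx)

# Toy panel: service-coverage index (x) vs HIV incidence per 1,000 (y);
# each country has a different level, but the within-country trend is shared
panel = {
    "A": [(50, 0.30), (55, 0.27), (60, 0.24)],
    "B": [(40, 0.50), (45, 0.47), (50, 0.44)],
}
slope = within_estimator(panel)
```

Because demeaning subtracts each country's own average, any fixed country-level confounder (baseline health infrastructure, reporting practices) drops out of the estimate.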
Hassan, S. S.; Nordqvist-Kleppe, S.; Asinger, N.; Wang, J.; Dillner, J.; Arroyo Muhr, L. S.
Human papillomavirus (HPV) testing is the primary method for cervical cancer screening, and a negative HPV test is associated with a very low subsequent risk of invasive cancer. Nevertheless, a small number of cervical cancers are diagnosed following an HPV-negative test result, posing challenges within HPV-based screening pathways. Using nationwide Swedish registry data on HPV testing, we identified women diagnosed with invasive cervical cancer between 2019 and 2024 and reconstructed their HPV testing histories from the National Cervical Screening Registry (NKCx). The most recent HPV test prior to diagnosis was defined as the index test, and longitudinal HPV testing trajectories were classified among women with an HPV-negative index test. Of 3,000 women diagnosed with invasive cancer, 243 (8.1%) had an HPV-negative index test. These women were older at diagnosis and more frequently diagnosed at advanced stages compared with women with an HPV-positive index test. Most HPV-negative index tests (66.3%) were performed in the peri-diagnostic period (±30 days of diagnosis). Among women with an HPV-negative index test, 52.7% (128/243) had no prior HPV testing recorded, while the remainder had consistently HPV-negative histories (33.3%, 83/243) or evidence of prior HPV positivity before the index negative test (14%, 32/243). Possible recurrent HPV positivity following an intervening negative test was rare (0.4%, 1/243). HPV-negative screening results preceding invasive cancer reflect heterogeneous screening histories and cannot be explained solely by test failure. Findings highlight the importance of reaching women earlier in screening programs and show that fluctuating HPV detectability is rare.
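The trajectory groups this abstract describes (no prior testing; consistently negative; prior positivity; recurrent positivity after an intervening negative) can be expressed as a small classifier. This is an illustrative reconstruction of the categories as described, not the registry study's actual code; the function name and result labels are assumptions.

```python
def classify_history(results):
    """Classify a chronological list of HPV results ('pos'/'neg'),
    where the final entry is the HPV-negative index test."""
    prior = results[:-1]          # tests before the negative index test
    if not prior:
        return "no prior testing"
    if "pos" not in prior:
        return "consistently HPV-negative"
    seen_pos = seen_neg_after_pos = False
    for r in prior:
        if r == "pos" and seen_neg_after_pos:
            # positivity recurring after an intervening negative test
            return "recurrent positivity"
        if r == "pos":
            seen_pos = True
        elif seen_pos:            # a negative following a positive
            seen_neg_after_pos = True
    return "prior HPV positivity"

# Example: prior positive, intervening negative, renewed positive,
# then the negative index test
group = classify_history(["pos", "neg", "pos", "neg"])
```

Scanning prior results in chronological order makes the rare "recurrent positivity" pattern distinguishable from simple prior positivity, mirroring the 1/243 vs 32/243 split reported above.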